# 32B-parameter large models

## GLM-4-32B-Base-0414

License: MIT · Publisher: THUDM · Tags: Large Language Model, Transformers, Multilingual

GLM-4-32B-Base-0414 is a large language model with 32 billion parameters, pre-trained on 15T tokens of high-quality data. It supports both Chinese and English and excels at tasks such as code generation and function calling.
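
Since the card lists the Transformers format, the model can be loaded through the standard Hugging Face API. Below is a minimal sketch, assuming the repo id `THUDM/GLM-4-32B-Base-0414` (publisher and model name taken from this listing) and a recent transformers release that includes the GLM-4 architecture; older releases may require `trust_remote_code=True`.

```python
# Minimal sketch: loading the base model with Hugging Face transformers.
# Repo id assumed from the listing; a 32B model needs roughly 64 GB of
# weights in bf16, so device_map="auto" shards it across available GPUs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "THUDM/GLM-4-32B-Base-0414"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # halves memory vs. fp32
    device_map="auto",           # shard across available devices
)

# Base (non-chat) model: plain text completion, e.g. a code-generation prompt.
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```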
## Virtuoso Medium V2

License: Apache-2.0 · Publisher: arcee-ai · Tags: Large Language Model, Transformers

Virtuoso Medium V2 is a 32-billion-parameter language model built on the Qwen-2.5-32B architecture and trained via distillation from DeepSeek-V3. It delivers strong performance across multiple benchmarks.
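
The card notes the model was produced by distilling DeepSeek-V3: in broad strokes, logit distillation trains the student (here, a Qwen-2.5-32B initialization) to match the teacher's token distributions. Below is a minimal sketch of the standard Hinton-style objective, not arcee-ai's actual training code; all tensors are hypothetical.

```python
# Sketch of a logit-distillation loss: KL divergence between the
# temperature-softened teacher and student distributions. Illustrative
# only; not the actual recipe used to train Virtuoso Medium V2.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    """student_logits, teacher_logits: (num_tokens, vocab_size)."""
    s = F.log_softmax(student_logits / temperature, dim=-1)
    t = F.softmax(teacher_logits / temperature, dim=-1)
    # batchmean sums over the vocabulary and averages over tokens;
    # the T^2 factor keeps gradient magnitudes stable as T varies.
    return F.kl_div(s, t, reduction="batchmean") * temperature**2
```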